New e-markets try in a number of ways to attract a critical mass of participation and usage. Two innovative, all-electronic options exchanges, the International Securities Exchange (ISE) and the Boston Options Exchange (BOX), opened for trading in 2000 and 2004, respectively. In contrast to rival floor markets, they offer immediate order execution, direct user access, and reduced costs. As a result, ISE and BOX grew trading volumes and won market share from four incumbent exchanges in the United States. We observe significant differences in broker order-routing practices across ISE and BOX, leading to the markets' different growth patterns. We develop and test hypotheses about new market growth using a panel of six years of quarterly disclosures from 24 major brokerage firms. We find that membership affiliations are the dominant force in predicting brokers' order-routing patterns. In contrast to prior research, network externalities, as measured by an exchange's previous-quarter market share, are not significant predictors after controlling for temporal heterogeneity. Our results suggest that executives of new electronic exchanges should concentrate on developing broker exchange affiliations and incentive schemes in order to achieve sustainable order levels. Furthermore, keeping a keen eye on the competitive landscape and reacting to changes in current and prospective competitors' affiliation structures may prove the most beneficial way to ensure continued success. Top management must identify the relative advantages of new entrants' affiliation structures and respond accordingly. A new entrant that provides incentives through a novel affiliation structure can capture significant order flow if the incumbent exchange does not react swiftly and effectively. The results are not limited to electronic exchanges but, we expect, extend to many situations where competing information technology platforms also benefit from user affiliation and network effects.
This paper analyzes the impact of e-commerce on markets where established firms face competition from Internet-based entrants with focused offerings. In particular, we study the retail brokerage sector, where the growth of online brokerages and the availability of alternate sources of information and research services have challenged the dominance of traditional brokerages. We develop a stylized game-theoretic model to analyze the impact of competition between an incumbent full-service brokerage firm with a bundled offering of research services and trade execution and an online entrant offering just trade execution. We find that as consumers' willingness to pay for research declines, the incumbent finds it optimal to unbundle its offering when competing with the online entrant. We also find that the online entrant chooses a lower quality of trade execution when faced with direct competition from the incumbent's unbundled offering. The analytical model motivates a unique field experiment placing actual simultaneous trades with traditional full-service and online brokers to compare order-handling practices and the quality of trade execution. In keeping with our analytical results, our empirical findings show a significant difference in the quality of execution between online brokerages and their full-service counterparts. We discuss the relevance of our findings for quality differentiation, price convergence, and profit decline in a variety of markets where traditional incumbents are faced with changes in the competitive landscape as a result of e-commerce.
Clear and precise metrics are essential for evaluating phenomena such as e-commerce ('Net'-enablement) and the organizational use of networks and the Internet for commercial activities. Researchers require them for theory building and testing; practitioners require them for improving organizational processes. But for IS professionals to engage in serious creation of metrics, it is critical to recognize: (1) that the phenomenon of net-enablement is an enduring change, probably led in the future by 'brick-cum-click' firms, (2) that some new and old measures need to be differentially applied, and (3) that the papers in this special issue are not the end of metrics creation, but just the beginning.
Metrics are the sine qua non for solid research, and scientific metrics have now been advanced with new approaches in the arena of Net-enablement (NE), otherwise known as e-commerce. Questions that likely require additional attention include: (1) Where and what is the real value in substituting information for physical processes? (2) Which NE systems effectively support end-to-end fulfillment? (3) When should a Net-enabled organization share information? With respect to extant studies in Net-enablement, the field has been advanced in three methodological dimensions. Multiple methods have been used to validate measures. Approaches to metrics using archival/secondary data have also been initiated. Finally, strong external validity has been established through large-scale data gathering.
The introduction of new screen-based systems for trading securities and futures contracts has led to the emergence of a "market for markets," and exchanges, broker-dealer firms, and market data vendors are competing to offer trade execution services that will attract customers and trading volumes. This competition is favored in the United States by regulatory bodies such as the SEC and the CFTC, which have taken steps such as encouraging the listing of equity options on multiple exchanges and approving the applications of screen-based systems for designation as contract markets. This paper examines the design of one screen-based futures market, the Cantor Financial Futures Exchange (CX), and describes its capabilities relative to the rival, floor-based market in Chicago. In comparison to traditional open-outcry mechanisms, the CX order-matching system maintains strict first-in, first-out (FIFO) time priority among submitted orders. Using a simple simulation model, we find that order matching leads to faster completion of desired trades and about a one-third reduction in transaction costs.
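The strict price-time (FIFO) priority described above can be illustrated with a minimal sketch. This is not CX's actual system; the class and order representation are invented for the example. Orders that cross the opposite side of the book execute immediately, and among resting orders at the same price, the earliest arrival trades first.

```python
from dataclasses import dataclass

@dataclass
class Order:
    trader: str
    side: str      # "buy" or "sell"
    price: float
    qty: int

class FifoBook:
    """Illustrative sketch of strict price-time (FIFO) priority matching.
    Names and structure are hypothetical, not CX's actual interface."""

    def __init__(self):
        self.bids = []   # resting buy orders, best price first
        self.asks = []   # resting sell orders, best price first

    def submit(self, order):
        trades = []
        same, opp = (self.bids, self.asks) if order.side == "buy" else (self.asks, self.bids)
        # Match against the opposite side while prices cross; at equal
        # prices, earlier arrivals trade first (FIFO time priority).
        while opp and order.qty > 0 and self._crosses(order, opp[0]):
            resting = opp[0]
            fill = min(order.qty, resting.qty)
            trades.append((resting.trader, order.trader, resting.price, fill))
            order.qty -= fill
            resting.qty -= fill
            if resting.qty == 0:
                opp.pop(0)
        if order.qty > 0:
            same.append(order)
            # Python's sort is stable, so ties at the same price
            # preserve arrival order -- exactly FIFO priority.
            same.sort(key=lambda o: -o.price if o.side == "buy" else o.price)
        return trades

    def _crosses(self, incoming, best):
        if incoming.side == "buy":
            return incoming.price >= best.price
        return incoming.price <= best.price
```

The key property is that a resting order at a given price can never be bypassed by a later arrival at the same price, which open-outcry floors do not guarantee.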
Financial markets perform many functions, but principal among them is to bring together buyers and sellers and to provide a mechanism for price discovery. Information technology has had a number of significant impacts on financial markets, enabling enormous increases in volumes and more sophisticated trading techniques, such as program trading and index arbitrage. Despite improvements, some large institutional investors identify shortcomings in today's markets that make the process of buying or selling large, block orders time-consuming and costly. To address these concerns, a new trading system, OptiMark, has been built around several innovations, including (1) a graphical user front end for depicting trading preferences, and (2) a back end built on high-performance computers that process expressions of trading interest according to a price-setting algorithm intended to achieve superior outcomes for traders. OptiMark provides a means for more cost-effective block trading and is expected to contribute to regulatory objectives. This paper details the operations of OptiMark, examines its adoption potential, and assesses the impact it may have on block trading, broker-dealer intermediaries, and the equities markets.
Several major securities markets, including Nasdaq in the United States and the London Stock Exchange's SEAQ, are organized as dealer markets that use computer screen displays of competitive dealer quotes to establish fair trade prices. To improve their markets and to reduce investors' trading costs, these exchanges are introducing new rules and systems for handling investors' orders. The redesign of a market structure raises important strategic issues for exchanges; more attractive trading mechanisms will increase order flow and improve liquidity, but margins and total profits earned by traditional exchange intermediaries may be reduced. To examine the consequences of market structure changes, we conducted experimental tests of the integration of an order-driven trading system into a dealer/quote-driven market. Using computer-based simulations of a stock market, experimental subjects traded using a traditional dealer quote screen to which a public limit order facility was added. Data captured on subjects' trading decisions revealed that the subjects used the limit order system, which attracted some orders that would have otherwise gone to dealers and lowered investor trading costs. The integration of limit orders reduced dealers' activities as a percentage of total market volume and lowered dealers' trading margins, except in a special "informed dealer" case.
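The mechanism by which a public limit order facility lowers investor trading costs can be sketched in a few lines: an incoming market order executes at the better of the best dealer quote and the best public limit order, so any limit order placed inside the dealer spread improves the price an investor receives. The function below is a hypothetical illustration, not the experimental system's actual logic.

```python
def execution_price(side, dealer_bid, dealer_ask, limit_book):
    """Return the price an incoming market order receives when a public
    limit order book is integrated with dealer quotes (illustrative).

    limit_book is a list of (side, price) tuples for resting limit orders.
    """
    if side == "buy":
        # A buyer pays the lower of the dealer ask and the best limit sell.
        limit_asks = [p for s, p in limit_book if s == "sell"]
        return min([dealer_ask] + limit_asks)
    # A seller receives the higher of the dealer bid and the best limit buy.
    limit_bids = [p for s, p in limit_book if s == "buy"]
    return max([dealer_bid] + limit_bids)
```

With a dealer market quoted 99.0 bid / 101.0 ask, a public limit sell at 100.5 lets the next buyer save 0.5 per share relative to trading with the dealer, which is the cost reduction the experiments measured in aggregate.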
Reasons for the mixed reactions to today's electronic off-exchange trading systems are examined, and regulatory implications are explored. Information technology (IT) could provide more automated markets, which have lower costs. Yet for an electronic trading system to form a liquid and widely used market, a sufficient number of traders would need to make a transition away from established trading venues and to this alternative way of trading. This transition may not actually occur for a variety of reasons. Two tests are performed of the feasibility and the desirability of transitions to new markets. In the first test, traders in a series of economic experiments demonstrate an ability to make a transition and develop a critical mass of trading activity in a newly opened market. In the second test, simulation is used to compare the floor-based specialist auction in place in most U.S. stock exchanges today to a disintermediated alternative employing screen-based order matching. The results indicate that reducing the role of dealer-intermediaries can actually diminish important measures of market quality. Our findings suggest that the low trading volumes on many off-exchange systems do not result from traders' inability to break away from established trading floors. Rather, today's off-exchange trading systems are not uniformly superior to the trading mechanisms of traditional exchanges. Thus, regulatory actions favoring off-exchange trading systems are not warranted. Improved designs for IT-based trading mechanisms are needed, however, and when these are available, they are likely to win significant trading volume from established exchanges.
New entrants in many industries are able to challenge the business of historically dominant firms. In many markets, dominant players have pursued pricing and service policies that, although once highly effective, now make their markets attractive targets for aggressive new entrants. The entrants' strategies rely on lower overhead costs, new technologies, alternative distribution channels, and the active targeting of profitable customers. Several factors make it both possible and attractive for entrants to attack dominant players, among them simplistic historical pricing mistakes and policies of promising or providing universal service. Restrictions on the flexibility of incumbents--both externally and internally imposed--may make it difficult for dominant players to defend themselves effectively against attack by more flexible entrants with cream-skimming strategies and newer technology. We develop a set of alternatives for incumbent firms facing increasing "contestability" in their markets and the threat of agile entrants.
Information technology (IT) radically alters the cost of capturing, storing, and analyzing information, and thus dramatically alters the value of the historical data represented by a firm's detailed transaction records of customer interactions. Yet information systems are seldom used to their full potential in developing flexible pricing strategies and tailored offerings for individual customers, based either on the actual cost of serving these customers or on their demonstrated preferences and requirements. Exploiting these data will become increasingly crucial in industries with heterogeneous customers and with costs that vary widely across customers, in order to enable flexible pricing and to provide services that are accurately targeted at the needs of specific customer segments. In addition, accurate, detailed, and robust cost-accounting systems and expertise in the interpretation of performance data will increasingly become essential for competing successfully; those firms prevented from accurate microsegmentation by corporate culture and tradition, by regulation, or by an outmoded information infrastructure will be vulnerable to newer and more nimble competitors. In particular, being the low-cost provider with economies of scale will not provide adequate defense against targeted cream-skimming by opportunistic competitors able to offer lower prices to selected customer segments. The earliest academic papers on the strategic implications of information technology explicitly adopted a framework recommending that firms adopt a single, simple generic strategy from a small set (cost leadership, differentiation, or niche). In contrast, recent experience suggests that IT may enable firms to select from more finely tuned strategic options, and that it may require them to implement multiple strategies simultaneously.
Traditional management accounting systems are limited in their ability to provide profitability information relevant to management decisions. The problems of inadequate profitability measurement are intensified in today's business environments, where changing margins due to deregulation and new entrants, new products with unknown costs, and customer sophistication in locating low-cost providers often combine to leave unprepared firms with growing numbers of loss-making client relationships. In response, firms in a number of service and manufacturing industries are experimenting with new methods for measuring performance, and are implementing these techniques using information systems. The collection and analysis of information on the profitability of customer relationships enables managers to identify and defend their most attractive market segments, and to turn loss-making accounts into profitable ones. The London-based securities house BZW developed BEATRICE, an innovative information system that combines activity-based accounting principles and a model of customer profitability to assign an income figure to each of the 6,000 trades the firm makes in a day. The system's value is considerable; it can be assessed using industry performance benchmarks, and by comparing the management decision making possible with the currently available information against what was possible with previous data.
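The core idea of a per-trade income assignment combining activity-based costs with a customer profitability model can be sketched simply. The cost drivers, rates, and field names below are invented for illustration; they are not BZW's actual figures or BEATRICE's design.

```python
# Hypothetical activity cost rates per trade (invented for the example).
ACTIVITY_COST = {"execution": 4.0, "settlement": 6.0, "sales_coverage": 15.0}

def trade_income(trade):
    """Assign an income figure to a single trade: revenue earned on the
    trade minus the activity-based cost of the work it consumed."""
    revenue = trade["commission"] + trade["spread_capture"]
    cost = sum(ACTIVITY_COST[a] for a in trade["activities"])
    return revenue - cost

def customer_profitability(trades):
    """Aggregate per-trade income assignments by customer account,
    exposing which relationships are loss-making overall."""
    totals = {}
    for t in trades:
        totals[t["customer"]] = totals.get(t["customer"], 0.0) + trade_income(t)
    return totals
```

A small, heavily serviced trade can carry negative assigned income even when its commission looks healthy, which is precisely the kind of loss-making relationship such a system is designed to surface.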
As the strategic importance of information technology (IT) has increased, the decision of where and when to allocate resources to IT programs has become more risky and more difficult. Executives are tempted by the opportunities for strategic impact, but struggle with the massive expenditures and uncertainties involved. Evaluating the opportunity afforded by a system and judging its strategic impact in advance have proven difficult, and even when analyses are performed well, they are frequently done on an ad hoc basis. IT can confer advantage under appropriate conditions, and equally important, even when it fails to confer advantage, it may still prove crucial. Both concepts--competitive advantage and strategic necessity--confound traditional financial analysis. We offer seven principles on which to base an evaluation of a strategic IT venture. Although we have not performed a statistical validation study, these principles are expressed as guidelines we believe to be true, based on experience. The guidelines range from modeling the investment decision, through managing risk, to preparing for unanticipated upside and downside implications.